Causal Back Propagation Through Time for Locally Recurrent Neural Networks

Authors

  • Paolo Campolucci
  • Aurelio Uncini
  • Francesco Piazza
Abstract

This paper concerns dynamic neural networks for signal processing: architectural issues are considered, but the paper focuses on learning algorithms that work on-line. Locally recurrent neural networks, namely MLP with IIR synapses and generalizations of Local Feedback MultiLayered Networks (LF MLN), are compared to more traditional neural networks, i.e. static MLP with input and/or output buffer (TDNN), FIR MLP and fully recurrent neural networks: simulation results are provided to compare locally recurrent neural networks with TDNN and FIR MLP. Moreover, we propose a new learning algorithm, based on Back Propagation Through Time and called Causal Back Propagation Through Time (CBPTT), that is faster and more stable than the algorithm previously used for the IIR MLP. The algorithm that we propose includes as particular cases the following algorithms: Wan's Temporal Back Propagation, Back Propagation for Sequences (BPS) and the Back-Tsoi algorithm.
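To make the idea of gradient training for an IIR synapse concrete, here is a minimal sketch for a single first-order IIR filter y[n] = b·x[n] + a·y[n-1]. It propagates the output sensitivities causally forward in time, which for this single-pole case yields the exact gradient; this is an illustrative toy, not the paper's CBPTT algorithm, and all names are hypothetical.

```python
def iir_forward(x, a, b):
    """Run the first-order IIR synapse y[n] = b*x[n] + a*y[n-1] over a sequence."""
    y, y_prev = [], 0.0
    for xn in x:
        y_prev = b * xn + a * y_prev
        y.append(y_prev)
    return y

def iir_gradients(x, d, a, b):
    """Gradients of J = (1/2) * sum_n (y[n] - d[n])^2 w.r.t. (a, b).

    Uses the causal sensitivity recursions
        dy[n]/da = y[n-1] + a * dy[n-1]/da
        dy[n]/db = x[n]   + a * dy[n-1]/db
    so the gradient is accumulated on-line, sample by sample.
    """
    y = iir_forward(x, a, b)
    ga = gb = 0.0       # accumulated gradients dJ/da, dJ/db
    dyda = dydb = 0.0   # output sensitivities at the previous step
    y_prev = 0.0
    for n, xn in enumerate(x):
        dyda = y_prev + a * dyda
        dydb = xn + a * dydb
        e = y[n] - d[n]          # output error at time n
        ga += e * dyda
        gb += e * dydb
        y_prev = y[n]
    return ga, gb
```

A gradient computed this way can be checked against a finite-difference estimate of the squared-error cost, which is a useful sanity test when deriving recursions like these by hand.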

Similar references

A Unifying View of Gradient Calculations and Learning for Locally Recurrent Neural Networks

In this paper a critical review of gradient-based training methods for recurrent neural networks is presented, including Back Propagation Through Time (BPTT), Real Time Recurrent Learning (RTRL) and several specific learning algorithms for different locally recurrent architectures. From this survey emerges the need for a unifying view of all the specific procedures proposed for networks wit...



On-line learning algorithms for locally recurrent neural networks

This paper focuses on on-line learning procedures for locally recurrent neural networks, with emphasis on the multilayer perceptron (MLP) with infinite impulse response (IIR) synapses and its variations, which include generalized output and activation feedback multilayer networks (MLNs). We propose a new gradient-based procedure called recursive backpropagation (RBP) whose on-line version, causal re...


Gradient Calculations for Dynamic... (draft of July 20, 1995, for IEEE Transactions on Neural Networks)

We survey learning algorithms for recurrent neural networks with hidden units, and put the various techniques into a common framework. We discuss fixed-point learning algorithms, namely recurrent backpropagation and deterministic Boltzmann Machines, and non-fixed-point algorithms, namely backpropagation through time, Elman's history cutoff, and Jordan's output feedback architecture. Forward propaga...


Extension of Backpropagation through Time for Segmented-memory Recurrent Neural Networks

We introduce an extended Backpropagation Through Time (eBPTT) learning algorithm for Segmented-Memory Recurrent Neural Networks. The algorithm was compared to an extension of the Real-Time Recurrent Learning algorithm (eRTRL) for this kind of network. Using the information latching problem as a benchmark task, the algorithms' ability to cope with the learning of long-term dependencies was tested...



Journal title:

Volume   Issue

Pages  -

Publication year: 1996